

Search for: All records

Creators/Authors contains: "Johnson, W."


  1. Abstract Advances in understanding the effects of the mechanical characteristics of prosthetic feet on user biomechanics have enabled passive prostheses to improve the walking pattern of people with lower limb amputation. However, there is no consensus on the design methodology and criteria required to maximize specific user outcomes and fully restore their mobility. The Lower Leg Trajectory Error (LLTE) framework is a novel design methodology based on the replication of lower leg dynamics. The LLTE value evaluates how closely a prosthetic foot replicates a target walking pattern. Designing a prosthesis that minimizes the LLTE value optimizes its mechanical function, enabling users to best replicate the target lower leg trajectory. Here, we conducted a systematic sensitivity investigation of LLTE-optimized prostheses. Five people with unilateral transtibial amputation walked overground at self-selected speeds using five prototype energy storage and return feet with varying LLTE values. The prototypes' LLTE values were varied by changing the stiffness of the participant's LLTE-optimized design by 60%, 80%, 120%, and 167%. Users most closely replicated the target able-bodied walking pattern with the LLTE-optimized stiffness, experimentally demonstrating that the predicted optimum was a true optimum. Additionally, the predicted LLTE values were correlated with the user's ability to replicate the target walking pattern, user preferences, and clinical outcomes including roll-over geometries, trunk sway, prosthetic energy return, and peak push-off power. This study further validates the use of the LLTE framework as a predictive and quantitative tool for designing and evaluating prosthetic feet.
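The abstract above describes the LLTE value as a dimensionless measure of how far a modeled lower leg trajectory deviates from an able-bodied target. A minimal sketch of such a metric is shown below; the exact normalization used by the authors is not given here, so this assumed form (root-mean-square error per variable, normalized by the target variable's mean magnitude) is illustrative only.

```python
import math

def llte(modeled, target):
    # Sketch of an LLTE-style metric (assumed form, not the authors' exact
    # definition). Each trajectory is a list of (x, y, theta) samples of the
    # lower leg over stance. The error in each variable is normalized by the
    # mean absolute value of that variable in the target trajectory, so the
    # three terms are dimensionless and comparable, then combined as an RMS.
    n = len(target)
    total = 0.0
    for var in range(3):  # x position, y position, leg angle
        norm = sum(abs(t[var]) for t in target) / n
        total += sum(((m[var] - t[var]) / norm) ** 2
                     for m, t in zip(modeled, target))
    return math.sqrt(total / (3 * n))
```

A candidate foot design whose simulated trajectory yields a smaller value of this metric would, under the framework, be predicted to better restore the target gait.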
  2. Abstract A novel, high-performance, cosmetic, rugged, appropriately costed, and mass-manufacturable prosthetic foot for use in low-income countries was designed and field tested. This ruggedized foot was created to accommodate the unique economic, environmental, and cultural requirements of users in India. A previous prototype that enabled able-bodied-like gait was modified to include a durable cosmetic cover without altering the tuned stiffness of the overall foot. After undergoing mechanical benchtop testing, the foot was distributed to prosthesis users in India to use for at least 5 months. Afterward, participants underwent clinical tests to evaluate walking performance, and additional benchtop testing was performed on the field-tested feet to identify changes in performance. The ruggedized foot endured 1 × 10⁶ fatigue cycles without failure and demonstrated the desired stiffness properties. Subjects walked significantly faster (0.14 m/s) with the ruggedized foot compared to the Jaipur foot, and the feet showed no visible sign of damage after months of use. Additionally, the field-tested feet showed little difference in stiffness from a set of unused controls. Anecdotal feedback from the participants indicated that the foot improved their speed and/or walking effort, but may benefit from more degrees of freedom about the ankle. The results suggest that the foot fulfills its design requirements; however, further field testing is required with more participants over a longer period to confirm the foot is suitable for use in developing countries.
  3. Abstract The walking pattern and comfort of a person with lower limb amputation are determined by the prosthetic foot’s diverse set of mechanical characteristics. However, most design methodologies are iterative and focus on individual parameters, preventing a holistic design of prosthetic feet for a user’s body size and walking preferences. Here we refined and evaluated the lower leg trajectory error (LLTE) framework, a novel quantitative and predictive design methodology that optimizes the mechanical function of a user’s prosthesis to encourage gait dynamics that match their body size and desired walking pattern. Five people with unilateral below-knee amputation walked over-ground at self-selected speeds using an LLTE-optimized foot made of Nylon 6/6, their daily-use foot, and a standardized commercial energy storage and return (ESR) foot. Using the LLTE feet, target able-bodied kinematics and kinetics were replicated to within 5.2% and 13.9%, respectively, 13.5% closer than with the commercial ESR foot. Additionally, energy return and center-of-mass propulsion work were 46% and 34% greater compared to the other two prostheses, which could lead to reduced walking effort. Similarly, peak limb loading and flexion moment on the intact leg were reduced by an average of 13.1%, lowering the risk of long-term injuries. The LLTE feet were preferred over the commercial ESR foot by all users and preferred over the daily-use feet by two participants. These results suggest that the LLTE framework could be used to design customized, high-performance ESR prostheses from low-cost Nylon 6/6 material. Further studies with larger sample sizes are warranted for verification.
  4. Tensegrity robots, which are composed of compressive elements (rods) and flexible tensile elements (e.g., cables), have a variety of advantages, including flexibility, low weight, and resistance to mechanical impact. Nevertheless, the hybrid soft-rigid nature of these robots also complicates the ability to localize and track their state. This work aims to address what has been recognized as a grand challenge in this domain, i.e., the state estimation of tensegrity robots, through a markerless, vision-based method, as well as novel, onboard sensors that can measure the length of the robot's cables. In particular, an iterative optimization process is proposed to track the 6-DoF pose of each rigid element of a tensegrity robot from an RGB-D video as well as endcap distance measurements from the cable sensors. To ensure that the pose estimates of rigid elements are physically feasible, i.e., that they do not result in collisions between rods or with the environment, physical constraints are introduced during the optimization. Real-world experiments are performed with a 3-bar tensegrity robot, which performs locomotion gaits. Given ground truth data from a motion capture system, the proposed method achieves less than 1 cm translation error and 3 degrees rotation error, which significantly outperforms alternatives. At the same time, the approach can provide accurate pose estimation throughout the robot's motion, while motion capture often fails due to occlusions.
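The core of the method above is an iterative optimization that reconciles pose estimates with cable-sensor distance measurements. The toy sketch below illustrates just that one residual term: gradient descent that nudges estimated endcap positions until their pairwise distances match the measured cable lengths. It is not the paper's full 6-DoF tracker (no RGB-D term, no collision constraints); function and parameter names are hypothetical.

```python
import math

def refine_endcaps(points, measured, pairs, steps=200, lr=0.01):
    # Toy sketch of the cable-length residual: minimize the squared mismatch
    # between estimated endcap distances and sensor measurements.
    # points: initial endcap position estimates; pairs: (i, j) endcap indices
    # joined by a measured cable; measured: sensed distances for those pairs.
    pts = [list(p) for p in points]
    for _ in range(steps):
        grads = [[0.0] * len(pts[0]) for _ in pts]
        for (i, j), d_meas in zip(pairs, measured):
            diff = [a - b for a, b in zip(pts[i], pts[j])]
            d = math.sqrt(sum(c * c for c in diff)) or 1e-9
            coef = 2.0 * (d - d_meas) / d  # gradient of (d - d_meas)^2
            for k, c in enumerate(diff):
                grads[i][k] += coef * c
                grads[j][k] -= coef * c
        for p, g in zip(pts, grads):
            for k in range(len(p)):
                p[k] -= lr * g[k]
    return pts
```

In the actual system this residual would be one term in a joint objective alongside the RGB-D alignment term, with physical-feasibility constraints enforced during the same iterations.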
  5. Abstract Background

    Blood-based biomarkers for diagnosing active tuberculosis (TB), monitoring treatment response, and predicting risk of progression to TB disease have been reported. However, validation of the biomarkers across multiple independent cohorts is scarce. A robust platform to validate TB biomarkers in different populations with clinical end points is essential to the development of a point-of-care clinical test. NanoString nCounter technology is an amplification-free digital detection platform that directly measures mRNA transcripts with high specificity. Here, we determined whether NanoString could serve as a platform for extensive validation of candidate TB biomarkers.

    Methods

    The NanoString platform was used for performance evaluation of existing TB gene signatures in a cohort in which signatures were previously evaluated on an RNA-seq dataset. A NanoString codeset that probes 107 genes comprising 12 TB signatures and 6 housekeeping genes (NS-TB107) was developed and applied to total RNA derived from whole blood samples of TB patients and individuals with latent TB infection (LTBI) from South India. The TBSignatureProfiler tool was used to score samples for each signature. An ensemble of machine learning algorithms was used to derive a parsimonious biomarker.

    Results

    Gene signatures present in NS-TB107 had statistically significant discriminative power for segregating TB from LTBI. Further analysis of the data yielded a NanoString 6-gene set (NANO6) that when tested on 10 published datasets was highly diagnostic for active TB.

    Conclusions

    The NanoString nCounter system provides a robust platform for validating existing TB biomarkers and deriving a parsimonious gene signature with enhanced diagnostic performance.

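The methods above score each blood sample against each candidate gene signature. As a much-simplified illustration of signature scoring (not the TBSignatureProfiler algorithm, which offers several scoring schemes), a sample can be scored as the mean z-score of the signature's genes relative to the cohort:

```python
import statistics

def signature_score(sample_expr, signature_genes, cohort_expr):
    # Illustrative sketch: score a sample for a gene signature as the mean
    # z-score of the signature's genes, standardized against the cohort's
    # per-gene mean and standard deviation. All names are hypothetical.
    zs = []
    for gene in signature_genes:
        values = [s[gene] for s in cohort_expr]
        mu = statistics.mean(values)
        sd = statistics.stdev(values) or 1.0  # guard against zero variance
        zs.append((sample_expr[gene] - mu) / sd)
    return sum(zs) / len(zs)
```

Higher scores for a TB-associated signature would then be expected in active TB samples than in LTBI samples, which is the discriminative power the study evaluates.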
  6.
    The benefit of integrating batches of genomic data to increase statistical power is often hindered by batch effects, or unwanted variation in data caused by differences in technical factors across batches. It is therefore critical to effectively address batch effects in genomic data to overcome these challenges. Many existing methods for batch effect adjustment assume the data follow a continuous, bell-shaped Gaussian distribution. However, in RNA-seq studies the data are typically skewed, over-dispersed counts, so this assumption is inappropriate and may lead to erroneous results. Negative binomial regression models have been used previously to better capture the properties of counts. We developed a batch correction method, ComBat-seq, using a negative binomial regression model that retains the integer nature of count data in RNA-seq studies, making the batch-adjusted data compatible with common differential expression software packages that require integer counts. We show in realistic simulations that the ComBat-seq-adjusted data yield better statistical power and control of false positives in differential expression compared to data adjusted by the other available methods. We further demonstrate in a real data example that ComBat-seq successfully removes batch effects and recovers the biological signal in the data.
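The key design point above is that adjusted counts must remain integers. The sketch below is a greatly simplified caricature of count-scale batch adjustment for a single gene (ComBat-seq itself fits gene-wise negative binomial regression models and maps quantiles between batch-specific and batch-free distributions); it only illustrates rescaling each batch toward the overall mean while preserving integer output.

```python
def mean_adjust_counts(counts, batches):
    # Caricature of integer-preserving batch adjustment for one gene:
    # rescale each batch's counts toward the overall mean, then round so the
    # adjusted data stay integers, as DE tools like those ComBat-seq targets
    # require. Not the ComBat-seq algorithm itself.
    overall = sum(counts) / len(counts)
    by_batch = {}
    for c, b in zip(counts, batches):
        by_batch.setdefault(b, []).append(c)
    batch_means = {b: sum(v) / len(v) for b, v in by_batch.items()}
    return [round(c * overall / batch_means[b]) if batch_means[b] else c
            for c, b in zip(counts, batches)]
```

A real correction would also preserve biological covariates (e.g., condition labels) while removing the batch term, which is what the regression formulation provides.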
  7. Genomic data are often produced in batches due to practical restrictions, which may lead to unwanted variation caused by discrepancies across processing batches. Such "batch effects" often have a negative impact on downstream biological analyses and require careful handling. In practice, batch effects are usually addressed by specifically designed software, which merges the data from different batches, estimates the batch effects, and removes them from the data. Here we focus on classification and prediction problems and propose a different strategy based on ensemble learning. We first develop prediction models within each batch, then integrate them through ensemble weighting methods. In contrast to the typical approach of removing batch effects from the merged data, our method integrates predictions rather than data. We provide a systematic comparison between these two strategies, using studies targeting diverse populations infected with tuberculosis. In one study, we simulated increasing levels of heterogeneity across random subsets of the study, which we treat as simulated batches. We then use the two methods to develop a genomic classifier for the binary indicator of disease status. We evaluate the accuracy of prediction in another independent study targeting a different population cohort. We observed a turning point in the level of heterogeneity, after which our strategy of integrating predictions yields better discrimination in independent validation than the traditional method of integrating the data. These observations provide practical guidelines for handling batch effects in the development and evaluation of genomic classifiers.
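The integrate-predictions strategy described above can be sketched in a few lines: each batch contributes its own classifier, and predictions on a new sample are combined as a weighted average. The weighting scheme is an assumption here; the paper explores specific ensemble weighting methods, and weights might in practice come from cross-batch validation performance.

```python
def ensemble_predict(batch_models, weights, x):
    # Sketch of the integrate-predictions strategy: each batch trains its own
    # classifier (batch_models are callables returning a predicted probability
    # of disease); predictions for sample x are combined as a weighted average.
    # The weighting scheme is an assumption, e.g., cross-batch validation
    # accuracy; uniform weights are the simplest default.
    total = sum(weights)
    return sum(w * model(x) for model, w in zip(batch_models, weights)) / total
```

This contrasts with the merge-then-correct pipeline: no pooled, batch-corrected training matrix is ever formed, so each model sees only homogeneous within-batch data.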